
    Some modifications to the SNIP journal impact indicator

    The SNIP (source normalized impact per paper) indicator measures the citation impact of scientific journals. The indicator, introduced by Henk Moed in 2010, is included in Elsevier's Scopus database. The SNIP indicator uses a source normalized approach to correct for differences in citation practices between scientific fields. The strength of this approach is that it does not require a field classification system in which the boundaries of fields are explicitly defined. In this paper, a number of modifications that will be made to the SNIP indicator are explained, and the advantages of the resulting revised SNIP indicator are pointed out. It is argued that the original SNIP indicator has some counterintuitive properties, and it is shown mathematically that the revised SNIP indicator does not have these properties. Empirically, the differences between the original SNIP indicator and the revised one turn out to be relatively small, although some systematic differences can be observed. Relations with other source normalized indicators proposed in the literature are discussed as well.
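
    As a rough illustration of the source normalized idea behind SNIP (not the exact Scopus definition, and leaving aside the refinements of the revised indicator), a journal's raw impact per paper can be divided by the citation potential of its field, estimated from the reference lists of the papers citing it. A minimal Python sketch with illustrative numbers and simplified counting rules:

        # Simplified SNIP-style calculation: raw impact per paper divided by an
        # estimate of the field's citation potential. Illustrative sketch only;
        # the actual SNIP definition used in Scopus differs in several details
        # (citation window, counting rules, and the revised formula).

        def raw_impact_per_paper(citations_per_paper):
            """Mean number of citations received per paper in the journal."""
            return sum(citations_per_paper) / len(citations_per_paper)

        def citation_potential(citing_reference_counts):
            """Mean reference-list length of the papers citing the journal,
            a proxy for how densely the journal's field cites."""
            return sum(citing_reference_counts) / len(citing_reference_counts)

        def snip_like(citations_per_paper, citing_reference_counts,
                      database_mean_reference_count):
            rip = raw_impact_per_paper(citations_per_paper)
            # Relative citation potential: field density compared with the database mean.
            rdcp = citation_potential(citing_reference_counts) / database_mean_reference_count
            return rip / rdcp

        # Example: papers receiving [3, 5, 2, 4] citations, cited by papers with
        # reference lists of length [40, 35, 50, 45], in a database whose mean
        # reference-list length is 30.
        print(snip_like([3, 5, 2, 4], [40, 35, 50, 45], 30.0))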

    A bibliometric classificatory approach for the study and assessment of research performance at the individual level: the effects of age on productivity and impact

    This paper sets forth a general methodology for conducting bibliometric analyses at the micro level. It combines several indicators grouped into three factors or dimensions which characterize different aspects of scientific performance. Different profiles or “classes” of scientists are described according to their research performance in each dimension. Results from the application of this methodology to CSIC scientists in Spain in three thematic areas are presented. Special emphasis is placed on the identification and description of top scientists from structural and bibliometric perspectives. The effects of age on the productivity and impact of the different classes of scientists are analyzed. The classificatory approach proposed herein may prove a useful tool in support of research assessment at the individual level and for exploring potential determinants of research success.
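
    The paper's own indicator set and class definitions are not reproduced here; purely as a hypothetical illustration of how a classificatory approach at the individual level can work, the sketch below scores each researcher on three dimensions relative to a peer population and derives a class label from how many dimensions exceed a percentile threshold. The dimensions, threshold, and labels are assumptions for illustration, not those used in the study.

        # Hypothetical classificatory sketch: score each researcher on three
        # dimensions relative to peers and derive a class label. All names,
        # thresholds, and labels are illustrative assumptions.

        def percentile_rank(value, population_values):
            """Fraction of the population with a value at or below `value`."""
            return sum(v <= value for v in population_values) / len(population_values)

        def classify(researcher, population, threshold=0.75):
            dims = ("productivity", "impact", "collaboration")
            high = sum(
                percentile_rank(researcher[d], [p[d] for p in population]) >= threshold
                for d in dims
            )
            return {3: "top", 2: "strong", 1: "average"}.get(high, "low")

        population = [
            {"productivity": 20, "impact": 1.8, "collaboration": 0.6},
            {"productivity": 5,  "impact": 0.7, "collaboration": 0.3},
            {"productivity": 12, "impact": 1.1, "collaboration": 0.5},
            {"productivity": 30, "impact": 2.5, "collaboration": 0.8},
        ]
        print(classify(population[3], population))  # 'top'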

    Open Access uptake by universities worldwide

    The implementation of policies promoting the adoption of an open science (OS) culture must be accompanied by indicators that allow monitoring the uptake of such policies and their potential effects on research publishing and sharing practices. This study presents indicators of open access (OA) at the institutional level for universities worldwide. By combining data from Web of Science, Unpaywall and the Leiden Ranking disambiguation of institutions, we track OA coverage of universities' output for 963 institutions. This paper presents the methodological challenges, conceptual discrepancies and limitations, and discusses further steps needed to move forward the discussion on fostering OA and OS practices and policies.
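
    As a minimal sketch of what an institution-level OA indicator amounts to (the share of an institution's publications available in open access), the code below groups publication records by institution and computes that share. The input structure and field names are illustrative assumptions, not the actual Web of Science / Unpaywall / Leiden Ranking data model.

        # Hypothetical sketch: OA coverage per institution as the fraction of
        # publications flagged as open access. Field names are assumptions.

        from collections import defaultdict

        def oa_coverage_by_institution(publications):
            """publications: iterable of dicts with 'institution' and 'is_oa' keys."""
            totals = defaultdict(int)
            oa_counts = defaultdict(int)
            for pub in publications:
                totals[pub["institution"]] += 1
                if pub["is_oa"]:
                    oa_counts[pub["institution"]] += 1
            return {inst: oa_counts[inst] / totals[inst] for inst in totals}

        records = [
            {"institution": "University A", "is_oa": True},
            {"institution": "University A", "is_oa": False},
            {"institution": "University B", "is_oa": True},
        ]
        print(oa_coverage_by_institution(records))  # {'University A': 0.5, 'University B': 1.0}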

    Towards a new crown indicator: Some theoretical considerations

    The crown indicator is a well-known bibliometric indicator of research performance developed by our institute. The indicator aims to normalize citation counts for differences among fields. We critically examine the theoretical basis of the normalization mechanism applied in the crown indicator. We also make a comparison with an alternative normalization mechanism. The alternative mechanism turns out to have more satisfactory properties than the mechanism applied in the crown indicator. In particular, the alternative mechanism has a so-called consistency property. The mechanism applied in the crown indicator lacks this important property. As a consequence of our findings, we are currently moving towards a new crown indicator, which relies on the alternative normalization mechanism.
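
    In essence, the two normalization mechanisms at issue differ in whether field normalization is applied before or after averaging: the original crown indicator divides total citations by total field-expected citations (a ratio of averages), while the alternative mechanism averages the per-publication normalized scores. A minimal sketch with illustrative numbers:

        # Two ways to field-normalize citation counts. c[i] is the observed
        # citation count of publication i, e[i] its field-expected citation rate.

        def ratio_of_averages(c, e):
            # Normalize after summing (the original crown indicator's mechanism).
            return sum(c) / sum(e)

        def average_of_ratios(c, e):
            # Normalize per publication, then average (the alternative mechanism).
            return sum(ci / ei for ci, ei in zip(c, e)) / len(c)

        c = [10, 0, 4]       # illustrative citation counts
        e = [5.0, 1.0, 4.0]  # illustrative field-expected rates
        print(ratio_of_averages(c, e))  # 1.4
        print(average_of_ratios(c, e))  # 1.0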

    Severe Language Effect in University Rankings: Particularly Germany and France are wronged in citation-based rankings

    We applied a set of standard bibliometric indicators to monitor the scientific state of the art of 500 universities worldwide and constructed a ranking on the basis of these indicators (Leiden Ranking 2010). We find a dramatic and hitherto largely underestimated language effect in the bibliometric, citation-based measurement of research performance when comparing rankings based on all Web of Science (WoS) covered publications with rankings based on English-language WoS covered publications only, particularly for Germany and France.

    Rivals for the crown: Reply to Opthof and Leydesdorff

    We reply to the criticism of Opthof and Leydesdorff [arXiv:1002.2769] on the way in which our institute applies journal and field normalizations to citation counts. We point out why we believe most of the criticism is unjustified, but we also indicate where we think Opthof and Leydesdorff raise a valid point.

    Measuring Open Access uptake: Data sources, expectations, and misconceptions

    In this paper we briefly introduce the concept of Open Access and review the many variants that have been presented in the literature. We then critically examine how OA variants are presented by each data source and how they are operationalized in practice. The goal of the paper is to provide a set of guidelines on how to effectively interpret OA information. For this, we compare OA figures reported in different data sources at the institutional and journal level and dig into the potential explanations behind the differences observed in the figures each source provides. Policy highlights: 1) Open Access reporting in bibliometric reports is now possible due to the proliferation of data sources which provide information on the OA status of publications. 2) Unpaywall has become the primary source of OA metadata for publications for the main bibliometric databases; however, there are divergences in how this is reported and shown by each of them. 3) Understanding how OA variants are defined by each source and later operationalized is key to correctly reporting and interpreting Open Access uptake.
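
    As an illustration of why operationalization matters, the sketch below derives an OA variant label from a publication record, roughly following the commonly used gold/hybrid/bronze/green categories. The field names and decision rules are assumptions for illustration; as the paper stresses, each data source defines and reports these variants somewhat differently.

        # Hypothetical OA-variant classification. Field names and rules are
        # illustrative; real data sources operationalize these categories differently.

        def oa_variant(record):
            if record.get("free_at_publisher"):
                if record.get("journal_is_fully_oa"):
                    return "gold"
                return "hybrid" if record.get("has_open_license") else "bronze"
            if record.get("repository_copy"):
                return "green"
            return "closed"

        print(oa_variant({"free_at_publisher": True, "journal_is_fully_oa": True}))  # gold
        print(oa_variant({"repository_copy": True}))                                 # green
        print(oa_variant({}))                                                        # closed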